Last data update: May 06, 2024. (Total: 46732 publications since 2009)
Records 1-27 (of 27 Records) |
Query Trace: Maier A[original query] |
Consensus on the key characteristics of immunotoxic agents as a basis for hazard identification
Germolec DR , Lebrec H , Anderson SE , Burleson GR , Cardenas A , Corsini E , Elmore SE , Kaplan BLF , Lawrence BP , Lehmann GM , Maier CC , McHale CM , Myers LP , Pallardy M , Rooney AA , Zeise L , Zhang L , Smith MT . Environ Health Perspect 2022 130 (10) 105001 BACKGROUND: Key characteristics (KCs), properties of agents or exposures that confer potential hazard, have been developed for carcinogens and other toxicant classes. KCs have been used in the systematic assessment of hazards and to identify assay and data gaps that limit screening and risk assessment. Many of the mechanisms through which pharmaceuticals and occupational or environmental agents modulate immune function are well recognized. Thus, KCs could be identified for immunoactive substances and applied to improve hazard assessment of immunomodulatory agents. OBJECTIVES: The goal was to generate a consensus-based synthesis of scientific evidence describing the KCs of agents known to cause immunotoxicity and potential applications, such as assays to measure the KCs. METHODS: A committee of 18 experts with diverse specialties identified 10 KCs of immunotoxic agents, namely, 1) covalently binds to proteins to form novel antigens, 2) affects antigen processing and presentation, 3) alters immune cell signaling, 4) alters immune cell proliferation, 5) modifies cellular differentiation, 6) alters immune cell-cell communication, 7) alters effector function of specific cell types, 8) alters immune cell trafficking, 9) alters cell death processes, and 10) breaks down immune tolerance. The group considered how these KCs could influence immune processes and contribute to hypersensitivity, inappropriate enhancement, immunosuppression, or autoimmunity. 
DISCUSSION: KCs can be used to improve efforts to identify agents that cause immunotoxicity via one or more mechanisms, to develop better testing and biomarker approaches to evaluate immunotoxicity, and to enable a more comprehensive and mechanistic understanding of adverse effects of exposures on the immune system. https://doi.org/10.1289/EHP10800. |
Impact of ergonomic posture on the chemical exposure of workers in the petroleum and chemical industry
Whitehead C , Maier MA , Rao MB , Eturki M , Snawder JE , Davis KG . Ann Work Expo Health 2022 66 (8) 1022-1032 OBJECTIVES: Despite a rise in automation, workers in the petroleum refining and petrochemical manufacturing industry are potentially exposed to various chemicals through inhalation while performing routine job duties. Many factors contribute to the degree of exposure experienced in this setting. The study objective was to characterize the impact of workplace conditions, anthropometric variability, and task orientation on exposure for a simulated routine operations task. METHODS: A chemical exposure laboratory simulation study was designed to evaluate the dependent variable of chemical exposure level in the breathing zone for methane and sulfur hexafluoride. The independent variables were (i) posture of the worker, (ii) worker anthropometry, (iii) process configuration, and (iv) gas density. RESULTS: Pipe height was a significant predictor of gas concentration measured in the breathing zone when located in a position that encouraged the gas to enter the breathing zone of the worker. Worker anthropometry had a major impact; chemical concentrations for the tall (male) worker exceeded those for the short (female) worker in methane simulations, but the opposite resulted for sulfur hexafluoride. Worker posture also had a significant impact on gas exposure, with nonneutral postures associated with higher chemical concentrations. CONCLUSIONS: The study findings indicate that the breathing zone location is altered by posture and worker height, which changes the exposures relative to the emission source depending on the gas density of the chemicals that are present. As a result, qualitative risk assessment cannot be performed accurately without accounting for these factors. Practically, controls may need to account for worker size differences and posture adaptations. |
Rapid review of dermal penetration and absorption of inorganic lead compounds for occupational risk assessment
Niemeier RT , Maier A , Reichard JF . Ann Work Expo Health 2022 66 (3) 291-311 Lead (Pb) exposure continues to be a significant public health issue in both occupational and non-occupational settings. The vast majority of exposure and toxicological studies have focused on effects related to inhalation and gastrointestinal exposure routes. Exposure to inorganic Pb compounds through dermal absorption has been less well studied, perhaps due to the assumption that the dermal pathway is a minor contributor to aggregate exposures to Pb compounds. The aim of this rapid review was to identify and evaluate published literature on dermal exposures to support the estimation of key percutaneous absorption parameters (Kp, flux, diffusion rate) for use in occupational risk assessment. Eleven articles were identified containing information from both in vitro and in vivo systems relevant to percutaneous absorption kinetics. These articles provided 24 individual study summaries and information for seven inorganic Pb compounds. The vast majority of study summaries evaluated (n = 22, 92%) reported detectable amounts of dermal absorption of inorganic Pb. Data were identified for four Pb compounds (Pb acetate, Pb nitrate, Pb oxide, and Pb metal) that may be sufficient to use in evaluating physiologically based pharmacokinetic models. Average calculated diffusion rates for the pool of animal and human skin data ranged from 10^-7 to 10^-4 mg cm^-2 h^-1, and Kp values ranged from 10^-7 to 10^-5 cm h^-1. Study design and documentation were highly variable, and only one of the studies identified was conducted using standard test guideline-compliant methodologies. Two studies provided quality estimates on the impacts of dermal absorption from water-insoluble Pb compounds on blood Pb levels. These two studies reported that exposures via dermal routes could elevate blood Pb by over 6 µg dl^-1. 
This estimate exceeds 5 µg dl^-1, the blood Pb level associated with adverse health effects in adults. The utility of these estimates for occupational dermal exposures is limited because confidence in them is low. The literature, while of limited quality, strongly suggests that, based on upper-bound diffusion rate estimates, inorganic Pb has the potential for dermal uptake in amounts associated with negative health outcomes. Future standard test guideline-compliant studies are needed to provide high-confidence estimates of dermal uptake. Such data would allow for improved evaluation of Pb exposures in an occupational risk assessment context. |
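The screening arithmetic implied by the upper-bound kinetics above can be sketched in a few lines. In this hedged illustration (Python), the flux is the review's reported upper bound, while the exposed skin area and shift duration are assumptions introduced here, not values from the review:

```python
# Illustrative screening estimate of dermal Pb uptake: mass = flux x area x time.
# Skin area and shift duration are hypothetical assumptions, not study values.

def dermal_uptake_mg(flux_mg_cm2_h: float, area_cm2: float, hours: float) -> float:
    """Mass absorbed (mg) over an exposure period, assuming constant flux."""
    return flux_mg_cm2_h * area_cm2 * hours

flux = 1e-4    # mg cm^-2 h^-1, upper bound of the reported 10^-7 to 10^-4 range
area = 1000.0  # cm^2, assumed exposed hands and forearms
shift = 8.0    # h, assumed full work shift

uptake = dermal_uptake_mg(flux, area, shift)
print(f"Estimated uptake: {uptake:.2f} mg per shift")  # 0.80 mg per shift
```

At the lower end of the reported flux range (10^-7 mg cm^-2 h^-1), the same calculation yields about 0.0008 mg per shift, a thousandfold spread that illustrates why the review characterizes confidence in these estimates as low.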
HLA-DPB1 E69 genotype and exposure in beryllium sensitisation and disease
Crooks J , Mroz MM , VanDyke M , McGrath A , Schuler C , McCanlies EC , Virji MA , Rosenman KD , Rossman M , Rice C , Monos D , Fingerlin TE , Maier LA . Occup Environ Med 2021 79 (2) 120-126 OBJECTIVES: Human leukocyte antigen-DP beta 1 (HLA-DPB1) with a glutamic acid at the 69th position of the ß chain (E69) genotype and inhalational beryllium exposure individually contribute to risk of chronic beryllium disease (CBD) and beryllium sensitisation (BeS) in exposed individuals. This retrospective nested case-control study assessed the contribution of genetics and exposure in the development of BeS and CBD. METHODS: Workers with BeS (n=444), CBD (n=449) and beryllium-exposed controls (n=890) were enrolled from studies conducted at nuclear weapons and primary beryllium manufacturing facilities. Lifetime-average beryllium exposure estimates were based on workers' job questionnaires and historical and industrial hygienist exposure estimates, blinded to genotype and case status. Genotyping was performed using sequence-specific primer-PCR. Logistic regression models were developed allowing for over-dispersion, adjusting for workforce, race, sex and ethnicity. RESULTS: Having no E69 alleles was associated with lower odds of both CBD and BeS; every additional E69 allele increased odds for CBD and BeS. Increasing exposure was associated with lower odds of BeS. CBD was not associated with exposure as compared to controls, yet the per cent of individuals with CBD versus BeS increased with increasing exposure. No evidence of a gene-by-exposure interaction was found for CBD or BeS. CONCLUSIONS: Risk of CBD increases with E69 allele frequency and increasing exposure, although no gene by environment interaction was found. A decreased risk of BeS with increasing exposure and lack of exposure response in CBD cases may be due to the limitations of reconstructed exposure estimates. 
Although reducing exposure may not prevent BeS, it may reduce CBD and the associated health effects, especially in those carrying E69 alleles. |
COVID-19 Outbreak Among Three Affiliated Homeless Service Sites - King County, Washington, 2020.
Tobolowsky FA , Gonzales E , Self JL , Rao CY , Keating R , Marx GE , McMichael TM , Lukoff MD , Duchin JS , Huster K , Rauch J , McLendon H , Hanson M , Nichols D , Pogosjans S , Fagalde M , Lenahan J , Maier E , Whitney H , Sugg N , Chu H , Rogers J , Mosites E , Kay M . MMWR Morb Mortal Wkly Rep 2020 69 (17) 523-526 On March 30, 2020, Public Health - Seattle and King County (PHSKC) was notified of a confirmed case of coronavirus disease 2019 (COVID-19) in a resident of a homeless shelter and day center (shelter A). Residents from two other homeless shelters (B and C) used shelter A's day center services. Testing for SARS-CoV-2, the virus that causes COVID-19, was offered to available residents and staff members at the three shelters during March 30-April 1, 2020. Among the 181 persons tested, 19 (10.5%) had positive test results (15 residents and four staff members). On April 1, PHSKC and CDC collaborated to conduct site assessments and symptom screening, isolate ill residents and staff members, reinforce infection prevention and control practices, provide face masks, and advise on sheltering-in-place. Repeat testing was offered April 7-8 to all residents and staff members who were not tested initially or who had negative test results. Among the 118 persons tested in the second round of testing, 18 (15.3%) had positive test results (16 residents and two staff members). In addition to the 31 residents and six staff members identified through testing at the shelters, two additional cases in residents were identified during separate symptom screening events, and four were identified after two residents and two staff members independently sought health care. In total, COVID-19 was diagnosed in 35 of 195 (18%) residents and eight of 38 (21%) staff members who received testing at the shelter or were evaluated elsewhere. 
COVID-19 can spread quickly in homeless shelters; rapid interventions including testing and isolation to identify cases and minimize transmission are necessary. CDC recommends that homeless service providers implement appropriate infection control practices, apply physical distancing measures including ensuring residents' heads are at least 6 feet (2 meters) apart while sleeping, and promote use of cloth face coverings among all residents (1). |
A framework for integrating information resources for chemical emergency management and response
Seaton MG , Maier A , Sachdeva S , Barton C , Ngai E , Lentz TJ , Rane PD , McKernan LT . Am J Disaster Med 2019 14 (1) 33-49 Effective emergency management and response require appropriate utilization of various resources as an incident evolves. This manuscript describes the information resources used in chemical emergency management and operations and how their utility evolves from the initial response phase to recovery to event close-out. The authors address chemical hazard guidance in the context of four different phases of emergency response: preparedness, emergency response (both initial and ongoing), recovery, and mitigation. Immediately following a chemical incident, during the initial response, responders often use readily available, broad-spectrum guidance to make rapid decisions in the face of uncertainties regarding potential exposure to physical and health hazards. Physical hazards are those posed by chemicals that can cause harm with or without direct contact; examples include explosives, flammables, and gases under pressure. This first line of resources may not be chemical-specific in nature, but it can provide guidance related to isolation distances, protective actions, and the most important physical and health threats. During the ongoing response phase, an array of resources can provide detailed information on physical and health hazards related to specific chemicals of concern. Consequently, risk management and mitigation actions evolve as well. When the incident stabilizes to a recovery phase, the types of information resources that facilitate safe and effective incident management evolve. Health and physical concerns transition from acute toxicity and immediate hazards to both immediate and latent health effects. 
Finally, the information inputs utilized during the preparedness phase include response evaluations of past events, emergency preparedness planning, and chemical-specific guidance about chemicals present. This manuscript details a framework for identifying the effective use of information resources at each phase and provides case study examples from chemical hazard emergencies. |
The Dermal Exposure Risk Management and Logic eToolkit: Characterizing and managing dermal exposure during emergency management operations
Hudson NL , Dotson GS , Maier A . J Emerg Manag 2018 16 (3) 159-172 OBJECTIVE: Emergency management and operations (EMO) personnel require up-to-date information to make informed decisions during natural and man-made disasters. However, information gaps present challenges for accessing human health risk assessment and risk management strategies for dermal exposure. This article describes the development of a decision support system, the Dermal Exposure Risk Management and Logic (DERMaL) eToolkit. DESIGN: The DERMaL eToolkit provides information on key resources used in emergency incidents. Resources were classified according to response phase, resource categories, and information category and evaluated on reliability, accessibility, and preference by subject matter experts in emergency management fields. These rankings were used to generate a value of information score, unique for each resource, which aids in developing reference lists for users during each incident phase. RESULTS: The tool identifies and prioritizes information resources on dermal risks, allowing users to readily find the most relevant information to suit EMO needs. CONCLUSION: The DERMaL eToolkit can be used as an aid in finding information resources targeted to scenario-driven needs by providing well-vetted and prioritized resources related to dermal hazards, exposure, and risk assessments for EMO. |
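The abstract does not specify how the value of information score combines the expert ratings. As a purely hypothetical sketch (Python), a score of the kind described could be a weighted sum of reliability, accessibility, and preference ratings; the weights, rating scale, and resource names below are all invented for illustration:

```python
# Hypothetical value-of-information score: weighted sum of expert ratings.
# Weights, the 1-5 rating scale, and resource names are assumptions.

WEIGHTS = {"reliability": 0.5, "accessibility": 0.3, "preference": 0.2}

def voi_score(ratings: dict) -> float:
    """Combine criterion ratings into a single prioritization score."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

resources = {
    "Safety data sheet": {"reliability": 5, "accessibility": 4, "preference": 3},
    "Field guide":       {"reliability": 3, "accessibility": 5, "preference": 4},
}

# Rank resources for a phase-specific reference list, highest score first.
ranked = sorted(resources, key=lambda r: voi_score(resources[r]), reverse=True)
print(ranked)
```

A weighted additive model is only one of several multicriteria approaches the developers might have used; the point is that per-criterion expert ratings collapse to a single sortable score per resource.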
Chemical-induced asthma and the role of clinical, toxicological, exposure and epidemiological research in regulatory and hazard characterization approaches
Vincent MJ , Bernstein JA , Basketter D , LaKind JS , Dotson GS , Maier A . Regul Toxicol Pharmacol 2017 90 126-132 Uncertainties in understanding all potential modes of action for asthma induction and elicitation hinder the design of hazard characterization and risk assessment methods that adequately screen and protect against hazardous chemical exposures. To address this challenge and identify current research needs, the University of Cincinnati and the American Cleaning Institute hosted a webinar series to discuss the current state-of-science regarding chemical-induced asthma. The general consensus is that the available database, comprised of data collected from routine clinical and validated toxicological tests, is inadequate for predicting or determining causal relationships between exposures and asthma induction for most allergens. More research is needed to understand the mechanism of asthma induction and elicitation in the context of specific chemical exposures and exposure patterns, and the impact of population variability and patient phenotypes. Validated tools to predict respiratory sensitization and to translate irritancy assays to asthma potency are needed, in addition to diagnostic biomarkers that assess and differentiate allergy versus irritant-based asthmatic responses. Diagnostic methods that encompass the diverse etiologies of asthmatic responses and incorporate robust exposure measurements capable of capturing different temporal patterns of complex chemical mixtures are needed. In the absence of ideal tools, risk assessors apply hazard-based safety assessment methods, in conjunction with active risk management, to limit potential asthma concerns, proactively identify new concerns, and ensure deployment of approaches to mitigate asthma-related risks. |
What community-based HIV prevention organizations say about their role in biomedical HIV prevention
Smith DK , Maier E , Betts J , Gray S , Kolodziejski B , Hoover KW . AIDS Educ Prev 2016 28 (5) 426-439 Community-based organizations (CBOs) are critical to delivery of effective HIV prevention because of their reach to key populations. This online survey of a national sample of CBOs assessed their awareness of, interest in, and resources needed to provide nonoccupational postexposure prophylaxis (nPEP), preexposure prophylaxis (PrEP), and HIV treatment as prevention (TasP). One hundred seventy-five CBOs participated: 87 clinical and 88 nonclinical CBOs. For nPEP, PrEP, and TasP, program managers reported that awareness was high (94%, 90%, 85%), meeting current client need was low (20%, 13%, 18%), and the likelihood of increasing their current provision with additional resources was somewhat high (62%, 64%, 62%). Clinical CBOs were more prepared to support expansion of these biomedical interventions than nonclinical CBOs. Meeting the information, training, and resource needs of CBOs is critical for effective collaboration to reduce the number of new HIV infections through expanded delivery of PrEP, nPEP, and TasP. |
Multi-Center Evaluation of the Xpert Norovirus Assay for Detection of Norovirus GI and GII in Fecal Specimens.
Gonzalez MD , Langley LC , Buchan BW , Faron ML , Maier M , Templeton K , Walker K , Popowitch EB , Miller MB , Rao A , Liebert UG , Ledeboer NA , Vinje J , Burnham CA . J Clin Microbiol 2016 54 (1) 142-7 Norovirus is the most common cause of sporadic gastroenteritis and outbreaks worldwide. The rapid identification of norovirus has important implications for infection prevention measures and may reduce the need for additional diagnostic testing. The Xpert Norovirus assay recently received FDA clearance for the detection and differentiation of norovirus genogroups I and II (GI and GII), which account for the vast majority of infections. In this study, we evaluated the performance of the Xpert Norovirus assay with both fresh, prospectively collected (n = 914) and frozen, archived (n = 489) fecal specimens. A Centers for Disease Control and Prevention (CDC) composite reference method was used as the gold standard for comparison. For both prospective and frozen specimens, the Xpert Norovirus assay showed positive percent agreement (PPA) and negative percent agreement (NPA) values of 98.3% and 98.1% for GI and of 99.4% and 98.2% for GII, respectively. Norovirus prevalence in the prospective specimens (collected from March to May of 2014) was 9.9% (n = 90), with the majority of positives caused by genogroup II (82%, n = 74). The positive predictive value (PPV) of the Xpert Norovirus assay was 75% for GI-positive specimens, whereas it was 86.5% for GII-positive specimens. The negative predictive values (NPV) for GI and GII were 100% and 99.9%, respectively. |
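The relationship between the agreement statistics and the predictive values reported above follows Bayes' rule. As a hedged illustration (Python), treating PPA/NPA as sensitivity and specificity and assuming a GII prevalence of roughly 8% (near the reported 74/914; the paper's exact PPV/NPV depend on the composite reference counts, so the figures below are approximate, not a reproduction):

```python
# Predictive values from sensitivity, specificity, and prevalence (Bayes' rule).
# Prevalence of ~8% is an assumption based on the reported 74/914 GII positives.

def ppv(sens: float, spec: float, prev: float) -> float:
    """P(disease | positive test)."""
    return sens * prev / (sens * prev + (1 - spec) * (1 - prev))

def npv(sens: float, spec: float, prev: float) -> float:
    """P(no disease | negative test)."""
    return spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

# GII figures from the abstract: PPA 99.4%, NPA 98.2%.
print(round(ppv(0.994, 0.982, 0.081), 2))  # ~0.83: low prevalence pulls PPV down
print(round(npv(0.994, 0.982, 0.081), 4))  # ~0.9995: NPV stays near 1
```

This is why an assay with near-99% agreement can still show a PPV well below 90% during a low-prevalence period, while its NPV remains essentially 100%.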
State-of-the-science: the evolution of occupational exposure limit derivation and application
Maier A , Lentz TJ , MacMahon KL , McKernan LT , Whittaker C , Schulte PA . J Occup Environ Hyg 2015 12 Suppl 1 S4-6 Occupational exposure limits (OELs) are a critical component of the risk assessment and risk management process and their use remains a staple of occupational hygiene practice. There are dozens of organizations and agencies that derive OELs worldwide. Yet, while most of these groups describe their administrative procedures as well as the rationale for the derivation of OELs for individual substances, few provide equally complete documentation of the underlying scientific methodology for conducting the quantitative risk assessment employed in OEL development. The paucity of written descriptions of OEL development methodology has resulted in a lack of transparency related to implementation of important scientific principles for OEL development and inconsistent practices for OEL development within and among organizations. The absence of such transparency limits the opportunities for international harmonization of existing values and OEL setting practices among organizations. Given these and other challenges, the National Institute for Occupational Safety and Health (NIOSH) began an effort to identify and characterize leading issues pertaining to OELs and their development through research, which culminated in a collection of articles focused on each key issue. Those articles and the key issues they explore comprise this supplement of the Journal of Occupational and Environmental Hygiene. Utilizing subject matter expertise from researchers and thought leaders in the occupational hygiene profession and affiliated fields of environmental public health, the goal of this effort is to describe the issues related to education and communication of science principles and to understand how they can be incorporated into (and thereby impact) the practices of OEL development and interpretation. 
Focusing specifically on the state-of-the-science in the fields of exposure science, occupational hygiene, risk assessment, and toxicology, this effort sought to provide a clear description of how advances in these research areas can contribute to the practice of OEL setting by reviewing the methods used for most OELs that are currently available as well as new methods that are actively being incorporated in the OEL process. An essential topic included within the set of complementary and interrelated articles dedicated to this pursuit is the consideration and interpretation of OELs in the context of evolving risk management practices. The articles are intended to serve as a current critical review of occupational risk assessment methods that will enable occupational hygiene professionals to have a clear understanding of the science methods incorporated in the OELs they develop or use. A brief introduction to each article in this collection is provided in the following paragraphs. |
Setting occupational exposure limits for chemical allergens - understanding the challenges
Dotson GS , Maier A , Siegel PD , Anderson SE , Green BJ , Stefaniak AB , Codispoti CD , Kimber I . J Occup Environ Hyg 2015 12 Suppl 1 S82-98 Chemical allergens represent a significant health burden in the workplace. Exposures to such chemicals can cause the onset of a diverse group of adverse health effects triggered by immune-mediated responses. Common responses associated with workplace exposures to low molecular weight (LMW) chemical allergens range from allergic contact dermatitis to life-threatening cases of asthma. Establishing occupational exposure limits (OELs) for chemical allergens presents numerous difficulties for occupational hygiene professionals. Few OELs have been developed for LMW allergens because of the unique biological mechanisms that govern the immune-mediated responses. The purpose of this article is to explore the primary challenges confronting the establishment of OELs for LMW allergens. Specific topics include: (1) understanding the biology of LMW chemical allergies as it applies to setting OELs; (2) selecting the appropriate immune-mediated response (i.e., sensitization versus elicitation); (3) characterizing the dose (concentration)-response relationship of immune-mediated responses; (4) determining the impact of temporal exposure patterns (i.e., cumulative versus acute exposures); and (5) understanding the role of individual susceptibility and exposure route. Additional information is presented on the importance of using alternative exposure recommendations and risk management practices, including medical surveillance, to aid in protecting workers from exposures to LMW allergens when OELs cannot be established. |
Aggregate exposure and cumulative risk assessment-integrating occupational and non-occupational risk factors
Lentz TJ , Dotson GS , Williams PR , Maier A , Gadagbui B , Pandalai SP , Lamba A , Hearl F , Mumtaz M . J Occup Environ Hyg 2015 12 Suppl 1 S112-26 Occupational exposure limits have traditionally focused on preventing morbidity and mortality arising from inhalation exposures to individual chemical stressors in the workplace. While central to occupational risk assessment, occupational exposure limits have limited application as a refined disease prevention tool because they do not account for all of the complexities of the work and non-occupational environments and are based on varying health endpoints. To be of greater utility, occupational exposure limits and other risk management tools could integrate broader consideration of risks from multiple exposure pathways and routes (aggregate risk) as well as the combined risk from exposure to both chemical and non-chemical stressors, within and beyond the workplace, including the possibility that such exposures may cause interactions or modify the toxic effects observed (cumulative risk). Although still at a rudimentary stage in many cases, a variety of methods and tools have been developed or are being used in allied risk assessment fields to incorporate such considerations in the risk assessment process. These approaches, which are collectively referred to as cumulative risk assessment, have potential to be adapted or modified for occupational scenarios and provide a tangible path forward for occupational risk assessment. Accounting for complex exposures in the workplace and the broader risks faced by the individual also requires a more complete consideration of the composite effects of occupational and non-occupational risk factors to fully assess and manage worker health problems. 
Barriers to integrating these different factors remain, but new and ongoing community-based and worker health-related initiatives may provide mechanisms for identifying and integrating risk from aggregate exposures and cumulative risks from all relevant sources, be they occupational or non-occupational. |
A decision support framework for characterizing and managing dermal exposures to chemicals during emergency management and operations
Dotson GS , Hudson NL , Maier A . J Emerg Manag 2015 13 (4) 359-380 Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management. |
Exposure estimation and interpretation of occupational risk: enhanced information for the occupational risk manager
Waters M , McKernan L , Maier A , Jayjock M , Schaeffer V , Brosseau L . J Occup Environ Hyg 2015 12 Suppl 1 0 The fundamental goal of this paper is to describe, define and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in both the exposure estimate and the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure than discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: 1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; 2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and 3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. |
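The shift the abstract describes, from a binary compliant/noncompliant call to risk as a probability, can be made concrete with the lognormal exposure model commonly used in occupational hygiene. A minimal sketch (Python); the geometric mean, geometric standard deviation, and OEL values are chosen for illustration only:

```python
# Probability that a lognormally distributed exposure exceeds an OEL,
# rather than a single yes/no compliance decision. GM, GSD, OEL illustrative.
from math import log
from statistics import NormalDist

def exceedance_probability(gm: float, gsd: float, oel: float) -> float:
    """P(X > OEL) for X ~ Lognormal with geometric mean gm and geometric SD gsd."""
    z = (log(oel) - log(gm)) / log(gsd)
    return 1.0 - NormalDist().cdf(z)

p = exceedance_probability(gm=0.5, gsd=2.5, oel=1.0)
print(f"Probability of exceeding the OEL: {p:.1%}")  # ~22%
```

A point-estimate comparison here (GM of 0.5 versus an OEL of 1.0) would read as "compliant," yet the distributional view shows the limit is exceeded on roughly one shift in five, which is exactly the added information the paper argues probabilistic tools provide.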
The global landscape of occupational exposure limits-implementation of harmonization principles to guide limit selection
Deveau M , Chen CP , Johanson G , Krewski D , Maier A , Niven KJ , Ripple S , Schulte PA , Silk J , Urbanus JH , Zalk DM , Niemeier RW . J Occup Environ Hyg 2015 12 Suppl 1 0 Occupational exposure limits (OELs) serve as health-based benchmarks against which measured or estimated workplace exposures can be compared. In the years since the introduction of OELs to public health practice, both developed and developing countries have established processes for deriving, setting, and using OELs to protect workers exposed to hazardous chemicals. These processes vary widely, however, and have thus resulted in a confusing international landscape for identifying and applying such limits in workplaces. The occupational hygienist will encounter significant overlap in coverage among organizations for many chemicals, while other important chemicals have OELs developed by few, if any, organizations. Where multiple organizations have published an OEL, the derived value often varies considerably, reflecting differences in both risk policy and risk assessment methodology as well as access to available pertinent data. This paper explores the underlying reasons for variability in OELs, and recommends the harmonization of risk-based methods used by OEL-deriving organizations. A framework is also proposed for the identification and systematic evaluation of OEL resources, which occupational hygienists can use to support risk characterization and risk management decisions in situations where multiple potentially relevant OELs exist. |
The scientific basis of uncertainty factors used in setting occupational exposure limits
Dankovic DA , Naumann BD , Maier A , Dourson ML , Levy LS . J Occup Environ Hyg 2015 12 Suppl 1 0 The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction from exposures at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full lifetime of exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. 
The increased application of scientific data in the development of uncertainty factors for individual chemicals also increases the transparency of occupational exposure limit derivation, and improved characterization of the scientific basis for these factors has brought greater rigor to their application within the overall derivation process. |
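The uncertainty factor arithmetic this abstract describes can be sketched as a simple calculation: a point of departure (POD) is divided by the product of the individual factors. The function below is a schematic illustration, not any organization's actual derivation procedure, and the default factor values and the example numbers are assumptions.

```python
def derive_limit(pod, uf_interspecies=1.0, uf_intraspecies=10.0,
                 uf_loael_to_noael=1.0, uf_duration=1.0,
                 uf_database=1.0, modifying_factor=1.0):
    """Divide a point of departure (e.g., a NOAEL in mg/m3) by the
    product of the applied uncertainty factors to obtain a candidate
    exposure limit. Returns (limit, composite_uf)."""
    composite = (uf_interspecies * uf_intraspecies * uf_loael_to_noael
                 * uf_duration * uf_database * modifying_factor)
    return pod / composite, composite

# Hypothetical example: an animal NOAEL of 50 mg/m3, with a
# chemical-specific interspecies adjustment factor of 3 replacing a
# default of 10, and the default intraspecies factor of 10.
limit, composite = derive_limit(50.0, uf_interspecies=3.0)
```

The chemical-specific adjustment factor mentioned in the abstract enters simply as a data-derived replacement for one of the default multipliers, which is what makes the derivation more transparent: each factor and its basis can be stated separately.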
The draft genome sequence of the ferret (Mustela putorius furo) facilitates study of human respiratory disease.
Peng X , Alfoldi J , Gori K , Eisfeld AJ , Tyler SR , Tisoncik-Go J , Brawand D , Law GL , Skunca N , Hatta M , Gasper DJ , Kelly SM , Chang J , Thomas MJ , Johnson J , Berlin AM , Lara M , Russell P , Swofford R , Turner-Maier J , Young S , Hourlier T , Aken B , Searle S , Sun X , Yi Y , Suresh M , Tumpey TM , Siepel A , Wisely SM , Dessimoz C , Kawaoka Y , Birren BW , Lindblad-Toh K , Di Palma F , Engelhardt JF , Palermo RE , Katze MG . Nat Biotechnol 2014 32 (12) 1250-5 The domestic ferret (Mustela putorius furo) is an important animal model for multiple human respiratory diseases. It is considered the 'gold standard' for modeling human influenza virus infection and transmission. Here we describe the 2.41 Gb draft genome assembly of the domestic ferret, constituting 2.28 Gb of sequence plus gaps. We annotated 19,910 protein-coding genes on this assembly using RNA-seq data from 21 ferret tissues. We characterized the ferret host response to two influenza virus infections by RNA-seq analysis of 42 ferret samples from influenza time-course data and showed distinct signatures in ferret trachea and lung tissues specific to 1918 or 2009 human pandemic influenza virus infections. Using microarray data from 16 ferret samples reflecting cystic fibrosis disease progression, we showed that transcriptional changes in the CFTR-knockout ferret lung reflect pathways of early disease that cannot be readily studied in human infants with cystic fibrosis disease. |
Cumulative Risk Assessment (CRA): transforming the way we assess health risks
Williams PR , Dotson GS , Maier A . Environ Sci Technol 2012 46 (20) 10868-74 Human health risk assessments continue to evolve and now focus on the need for cumulative risk assessment (CRA). CRA involves assessing the combined risk from coexposure to multiple chemical and nonchemical stressors for varying health effects. CRAs are broader in scope than traditional chemical risk assessments because they allow for a more comprehensive evaluation of the interaction between different stressors and their combined impact on human health. Future directions of CRA include greater emphasis on local-level community-based assessments; integrating environmental, occupational, community, and individual risk factors; and identifying and implementing common frameworks and risk metrics for incorporating multiple stressors. |
Chronic beryllium disease, HLA-DPB1, and the DP peptide binding groove
Silveira LJ , McCanlies EC , Fingerlin TE , Van Dyke MV , Mroz MM , Strand M , Fontenot AP , Bowerman N , Dabelea DM , Schuler CR , Weston A , Maier LA . J Immunol 2012 189 (8) 4014-23 Multiple epidemiologic studies demonstrate associations between chronic beryllium disease (CBD), beryllium sensitization (BeS), and HLA-DPB1 alleles with a glutamic acid residue at position 69 (E69). Results suggest that the less-frequent E69 variants (non-*0201/*0202 alleles) might be associated with greater risk of CBD. In this study, we sought to define specific E69-carrying alleles and their amino acid sequences in the DP peptide binding groove, as well as their relationship to CBD and BeS risk, using the largest case control study to date. We enrolled 502 BeS/CBD subjects and 653 beryllium-exposed controls from three beryllium industries who gave informed consent for participation. Non-Hispanic white cases and controls were frequency-matched by industry. HLA-DPB1 genotypes were determined using sequence-specific primer PCR. The E69 alleles were tested for association with disease individually and grouped by amino acid structure using logistic regression. The results show that CBD cases were more likely than controls to carry a non-*02 E69 allele than an *02 E69, with odds ratios (95% confidence interval) ranging from 3.1 (2.1-4.5) to 3.9 (2.6-5.9) (p < 0.0001). Polymorphic amino acids at positions 84 and 11 were associated with CBD: DD versus GG, 2.8 (1.8-4.6), p < 0.0001; GD versus GG, 2.1 (1.5-2.8), p < 0.0001; LL versus GG, 3.2 (1.8-5.6), p < 0.0001; GL versus GG, 2.8 (2.1-3.8), p < 0.0001. Similar results were found within the BeS group and CBD/BeS combined group. We conclude that the less frequent E69 alleles confer more risk for CBD than does *0201. 
Recent studies examining how the composition and structure of the binding pockets influence peptide binding in MHC genes, as well as studies showing that the topology of the TCR likely favors preferential binding to DPB1, provide a plausible biological rationale for these findings. |
Risk assessment's new era, part 1: challenges for industrial hygiene
Dotson GS , Rossner A , Maier A , Boelter FW . Synergist (Akron) 2012 23 (4) 24-26 Risk is an inherent aspect of our lives. Whether the topic is the nation's dietary habits, community air pollution or chemical exposures in the workplace, risk analysis is an integral part of the conversation. | Risk analysis is the combined activities of assessing, managing and communicating human health risks. Interest in understanding risk from chemical exposures and other stressors has led to the formalization of health risk assessment as an applied public health science. Numerous seminal reports from the National Academies of Science (NAS) have highlighted the framework for risk assessment and risk management, as well as key changes within the practice of risk analysis. | The profession of industrial hygiene has evolved to reflect the changes in health risk assessment methodology and practice. Traditional industrial hygiene practice (the anticipation, recognition, evaluation and control of occupational and environmental hazards and risks) parallels key aspects of health risk assessment. Thus, industrial hygienists have a strong history as leading practitioners of all aspects of risk analysis (health risk assessment, risk management and risk communication) within the occupational environment. |
Risk assessment's new era, part 2: evolving methods and future directions
Williams PRD , Dotson GS , Maier A . Synergist (Akron) 2012 23 (5) 46-48 For more than three decades, health practitioners and regulatory agencies have used risk assessment methods to characterize health risks. Risk assessment is the process of determining the likelihood and severity of health risk to an individual or population from exposure to a chemical or other stressor. Evolving methods and advances in science and technology offer several opportunities for improving risk assessment and its application to occupational settings. |
Efficacy of predictive modeling as a scientific criterion in dermal hazard identification for assignment of skin notations
Chen CP , Ahlers HW , Dotson GS , Lin YC , Chang WC , Maier A , Gadagbui B . Regul Toxicol Pharmacol 2011 61 (1) 63-72 Skin notations (SNs) represent a hazard characterization tool for alerting workers of health hazards associated with dermal contact with chemicals. This study evaluated the efficacy of a predictive model utilized by the National Institute for Occupational Safety and Health to identify dermal hazards based on potential of systemic absorption compared to hazard assignments based on dermal lethal dose 50% or logarithm of octanol-water partition coefficient. A total of 480 chemicals assigned an SN from at least one of seven institutes were selected and partitioned into seven hazard categories by frequency of SN assignment to provide a basis of evaluation for the predictivity of the examined criteria. We find that all three properties serve as a qualitative indicator in support of a dichotomous decision on dermal hazard; the predictive modeling was identified from a multiple regression analysis as the most significant indicator. The model generated estimates that corresponded to anticipated hazard potentials, suggesting a role of the model to further serve as a hazard-ranking tool. The hazard-ranking capability of the model was consistent with the scheme of acute toxicity classification in the Globally Harmonized System of Classification and Labeling of Chemicals. |
The evolution of skin notations for occupational risk assessment: a new NIOSH strategy
Dotson GS , Chen CP , Gadagbui B , Maier A , Ahlers HW , Lentz TJ . Regul Toxicol Pharmacol 2011 61 (1) 53-62 This article presents an overview of a strategy for assignment of hazard-specific skin notations (SK), developed by the National Institute for Occupational Safety and Health (NIOSH). This health hazard characterization strategy relies on multiple SKs capable of delineating systemic (SYS), direct (DIR), and immune-mediated (SEN) adverse effects caused by dermal exposures to chemicals. One advantage of the NIOSH strategy is the ability to combine SKs when it is determined that a chemical may cause multiple adverse effects following dermal contact (e.g., SK: SYS-DIR-SEN). Assignment of the SKs is based on a weight-of-evidence (WOE) approach, which refers to the critical examination of all available data from diverse lines of evidence and the derivation of a scientific interpretation based on the collective body of data including its relevance, quality, and reported results. Numeric cutoff values, based on indices of toxic potency, serve as guidelines to aid in consistently determining a chemical's relative toxicity and hazard potential. The NIOSH strategy documents the scientific rationale for determination of the hazard potential of a chemical and the subsequent assignment of SKs. A case study of acrylamide is presented as an application of the NIOSH strategy. |
Interpreting borderline BeLPT results
Middleton DC , Mayer AS , Lewin MD , Mroz MM , Maier LA . Am J Ind Med 2010 54 (3) 205-9 BACKGROUND: The beryllium lymphocyte proliferation test (BeLPT) identifies persons sensitized to beryllium (BeS) and thus at risk for chronic beryllium disease (CBD). BeLPT test results are abnormal (AB), borderline (BL), or normal (NL). This manuscript addresses the predictive value and interpretation of BL BeLPT results. METHODS: The various three-result combinations that meet or exceed a nominal referral criterion of 1 AB + 1 BL are assessed with probability modeling and compared. RESULTS: At 2% prevalence, the three-result combinations that meet or exceed this referral criterion and their associated probabilities of BeS are: (a) 1 AB + 1 BL + 1 NL (72%); (b) 3 BL (91%); (c) 2 AB + 1 NL (95%); (d) 1 AB + 2 BL (99%); (e) 2 AB + 1 BL (100%); and (f) 3 AB (100%). CONCLUSION: These results suggest that BL results are meaningful and that three BL results predict BeS across a broad range of population prevalences. An analysis of longitudinal BeLPT results and clinical findings from an actual surveillance program is warranted to confirm the model's predictions. |
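The kind of probability modeling summarized above can be sketched with Bayes' rule, treating repeated BeLPT results as conditionally independent given true sensitization status. The per-category result probabilities below are invented for illustration; they are not the values used in the paper, so the posteriors differ from the published ones.

```python
def posterior_sensitized(results, prevalence, p_result_sensitized,
                         p_result_unsensitized):
    """Posterior probability of beryllium sensitization given a list of
    BeLPT results ('AB', 'BL', or 'NL'), assuming results are
    conditionally independent given true status."""
    joint_s = prevalence
    joint_ns = 1.0 - prevalence
    for r in results:
        joint_s *= p_result_sensitized[r]
        joint_ns *= p_result_unsensitized[r]
    return joint_s / (joint_s + joint_ns)

# Hypothetical result-category probabilities (NOT from the paper):
P_SENS = {"AB": 0.60, "BL": 0.25, "NL": 0.15}
P_UNSENS = {"AB": 0.01, "BL": 0.04, "NL": 0.95}

# Even at 2% prevalence, three borderline results are already highly
# informative under these assumed probabilities.
p_three_bl = posterior_sensitized(["BL", "BL", "BL"], 0.02, P_SENS, P_UNSENS)
```

The qualitative point matches the abstract: because a BL result is far more likely in a sensitized than in an unsensitized person, a run of borderline results shifts the posterior strongly toward sensitization despite low prevalence.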
A Bayesian network model for biomarker-based dose response
Hack CE , Haber LT , Maier A , Schulte P , Fowler B , Lotz WG , Savage Jr RE . Risk Anal 2010 30 (7) 1037-51 A Bayesian network model was developed to integrate diverse types of data to conduct an exposure-dose-response assessment for benzene-induced acute myeloid leukemia (AML). The network approach was used to evaluate and compare individual biomarkers and quantitatively link the biomarkers along the exposure-disease continuum. The network was used to perform the biomarker-based dose-response analysis, and various other approaches to the dose-response analysis were conducted for comparison. The network-derived benchmark concentration was approximately an order of magnitude lower than that from the usual exposure concentration versus response approach, which suggests that the presence of more information in the low-dose region (where changes in biomarkers are detectable but effects on AML mortality are not) helps inform the description of the AML response at lower exposures. This work provides a quantitative approach for linking changes in biomarkers of effect both to exposure information and to changes in disease response. Such linkage can provide a scientifically valid point of departure that incorporates precursor dose-response information without being dependent on the difficult issue of a definition of adversity for precursors. |
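The idea of linking biomarkers along the exposure-disease continuum can be illustrated with the simplest possible network: a discrete chain exposure → biomarker → disease, where the disease probability at a given exposure is obtained by marginalizing over the biomarker node. This toy chain and all conditional probabilities are assumptions for illustration; the paper's actual network for benzene and AML is far richer.

```python
def p_disease_given_exposure(p_biomarker_given_exposure,
                             p_disease_given_biomarker):
    """For a chain E -> B -> D, compute
    P(D | E) = sum over biomarker states b of P(D | B=b) * P(B=b | E)."""
    return sum(p_b * p_disease_given_biomarker[b]
               for b, p_b in p_biomarker_given_exposure.items())

# Hypothetical conditional probability tables: at a low exposure the
# biomarker is usually unchanged, and disease risk rises with the
# biomarker state.
p_b_low = {"normal": 0.70, "elevated": 0.30}
p_d_given_b = {"normal": 0.001, "elevated": 0.010}

risk_low = p_disease_given_exposure(p_b_low, p_d_given_b)
```

Because the biomarker node responds at exposures below those where disease is observable, data on P(B | E) constrain the low-dose end of the dose-response curve; that is the mechanism, in miniature, by which the network-derived benchmark concentration can fall below one based on exposure-versus-response alone.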
Management and outcomes after multiple corneal and solid organ transplantations from a donor infected with rabies virus
Maier T , Schwarting A , Mauer D , Ross RS , Martens A , Kliem V , Wahl J , Panning M , Baumgarte S , Muller T , Pfefferle S , Ebel H , Schmidt J , Tenner-Racz K , Racz P , Schmid M , Struber M , Wolters B , Gotthardt D , Bitz F , Frisch L , Pfeiffer N , Fickenscher H , Sauer P , Rupprecht CE , Roggendorf M , Haverich A , Galle P , Hoyer J , Drosten C . Clin Infect Dis 2010 50 (8) 1112-9 BACKGROUND: This article describes multiple transmissions of rabies via transplanted solid organ from a single infected donor. The empirical Milwaukee treatment regimen was used in the recipients. METHODS: Symptomatic patients were treated by deep sedation (ketamine, midazolam, and phenobarbital), ribavirin, interferon, and active and passive vaccination. Viral loads and antibodies were continuously monitored. RESULTS: Recipients of both cornea and liver transplants developed no symptoms. The recipient of the liver transplant had been vaccinated approximately 20 years before transplantation. Two recipients of kidney and lung transplants developed rabies and died within days of symptomatic disease. Another kidney recipient was treated 7 weeks before he died. The cerebrospinal fluid viral load remained at constant low levels (<10,000 copies/mL) for approximately 5 weeks; it increased suddenly by almost 5 orders of magnitude thereafter. After death, no virus was found in peripheral compartments (nerve tissue, heart, liver, or the small intestine) in this patient, in contrast to in patients in the same cohort who died early. CONCLUSIONS: Our report includes, to our knowledge, the longest documented treatment course of symptomatic rabies and the first time that the virus concentration was measured over time and in different body compartments. The postmortem virus concentration in the periphery was low, but there was no evidence of a reduction of virus in the brain. |
- Page last reviewed:Feb 1, 2024
- Page last updated:May 06, 2024